Guarantees of Augmented Trace Norm Models in Tensor Recovery

Authors

  • Ziqiang Shi
  • Jiqing Han
  • Tieran Zheng
  • Ji Li
Abstract

This paper studies the recovery guarantees of models that minimize ‖X‖∗ + (1/(2α))‖X‖²_F, where X is a tensor and ‖X‖∗ and ‖X‖_F are its trace and Frobenius norms, respectively. We show that these models can efficiently recover low-rank tensors. In particular, they enjoy exact recovery guarantees similar to those known for minimizing ‖X‖∗, under conditions on the sensing operator such as its null-space property, restricted isometry property, or spherical section property. To recover a low-rank tensor X⁰, minimizing ‖X‖∗ + (1/(2α))‖X‖²_F returns the same solution as minimizing ‖X‖∗ almost whenever α ≥ 10 maxᵢ ‖X⁰₍ᵢ₎‖₂, where X⁰₍ᵢ₎ denotes the mode-i unfolding of X⁰.

Index Terms: Tensor norms, augmented model, convex optimization, low-rank recovery, restricted isometry property.
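The practical appeal of the augmented objective ‖X‖∗ + (1/(2α))‖X‖²_F is that, for a single matrix (e.g., one mode-i unfolding of the tensor), its proximal operator still has a closed form: soft-threshold the singular values as for the plain trace norm, then rescale by α/(α+1). The following is a minimal sketch of that prox step, not code from the paper; the function name and parameters are illustrative.

```python
import numpy as np

def prox_augmented_trace_norm(Y, tau=1.0, alpha=2.0):
    """Proximal operator of f(X) = tau*||X||_* + ||X||_F^2 / (2*alpha),
    i.e. the minimizer of f(X) + 0.5*||X - Y||_F^2.

    By unitary invariance the problem separates over singular values:
    each sigma of Y maps to (alpha/(alpha+1)) * max(sigma - tau, 0).
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    s_new = (alpha / (alpha + 1.0)) * np.maximum(s - tau, 0.0)
    return U @ np.diag(s_new) @ Vt

# Example: shrink a diagonal matrix; the small singular value is
# zeroed out, so the output has lower rank than the input.
Y = np.diag([3.0, 0.5])
X = prox_augmented_trace_norm(Y, tau=1.0, alpha=2.0)
```

Compared with the plain trace-norm prox, the only change is the extra α/(α+1) scaling contributed by the Frobenius term, which is what makes the augmented model strongly convex while (per the abstract) preserving exact recovery for sufficiently large α.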


Similar Articles

Scalable and Sound Low-Rank Tensor Learning

Many real-world data arise naturally as tensors. Equipped with a low rank prior, learning algorithms can benefit from exploiting the rich dependency encoded in a tensor. Despite its prevalence in low-rank matrix learning, trace norm ceases to be tractable in tensors and therefore most existing works resort to matrix unfolding. Although some theoretical guarantees are available, these approaches...

Full text

Regularized Orthogonal Tensor Decompositions for Multi-Relational Learning

Multi-relational learning has received lots of attention from researchers in various research communities. Most existing methods either suffer from superlinear per-iteration cost, or are sensitive to the given ranks. To address both issues, we propose a scalable core tensor trace norm Regularized Orthogonal Iteration Decomposition (ROID) method for full or incomplete tensor analytics, which can...

Full text

Theoretical and Experimental Analyses of Tensor-Based Regression and Classification

We theoretically and experimentally investigate tensor-based regression and classification. Our focus is regularization with various tensor norms, including the overlapped trace norm, the latent trace norm, and the scaled latent trace norm. We first give dual optimization methods using the alternating direction method of multipliers, which is computationally efficient when the number of trainin...

Full text

Exact Tensor Completion from Sparsely Corrupted Observations via Convex Optimization

This paper conducts a rigorous analysis for provable estimation of multidimensional arrays, in particular third-order tensors, from a random subset of its corrupted entries. Our study rests heavily on a recently proposed tensor algebraic framework in which we can obtain tensor singular value decomposition (t-SVD) that is similar to the SVD for matrices, and define a new notion of tensor rank re...

Full text

A New Convex Relaxation for Tensor Completion

We study the problem of learning a tensor from a set of linear measurements. A prominent methodology for this problem is based on a generalization of trace norm regularization, which has been used extensively for learning low rank matrices, to the tensor setting. In this paper, we highlight some limitations of this approach and propose an alternative convex relaxation on the Euclidean ball. We ...

Full text


Journal:

Volume   Issue 

Pages  -

Publication date: 2013